
Klein, Barbara D.

Topic Weight  Topic Terms
0.141         human, awareness, conditions, point, access, humans, images, accountability, situational, violations, result, reduce, moderation, gain, people
0.135         errors, error, construction, testing, spreadsheet, recovery, phase, spreadsheets, number, failures, inspection, better, studies, modules, rate
0.128         theory, theories, theoretical, paper, new, understanding, work, practical, explain, empirical, contribution, phenomenon, literature, second, implications
0.117         detection, deception, assessment, credibility, automated, fraud, fake, cues, detecting, results, screening, study, detect, design, indicators

Co-authorship network: Davis, Gordon B. (1 co-authorship); Goodhue, Dale L. (1 co-authorship).
Related topic nodes: Information attributes (1); user behavior data (1).

Articles (1)

Can Humans Detect Errors in Data? Impact of Base Rates, Incentives, and Goals. (MIS Quarterly, 1997)
Authors: Klein, Barbara D.; Goodhue, Dale L.; Davis, Gordon B.
Abstract:
    There is strong evidence that data items stored in organizational databases have a significant rate of errors. If undetected in use, these errors in stored data may significantly affect business outcomes. Published research suggests that users of information systems tend to be ineffective in detecting data errors. This paper argues, however, that rather than accepting poor human error detection performance, MIS researchers need to develop better theories of human error detection and to improve their understanding of the conditions under which performance improves. The paper applies several theory bases (primarily signal detection theory, but also a theory of individual task performance, theories of effort and accuracy in decision making, and theories of goals and incentives) to develop a set of propositions about successful human error detection. These propositions are tested in a laboratory setting. The results present a strong challenge to earlier assertions that humans are poor detectors of data errors: the findings of the two laboratory experiments show that explicit error detection goals and incentives can modify error detection performance. These findings provide an improved understanding of the conditions under which users detect data errors, and they indicate that it is possible to influence detection behavior in organizational settings through managerial directives, training, and incentives.
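
The signal detection framing in the abstract can be made concrete with the theory's standard sensitivity measure. The snippet below is a minimal illustrative sketch, not code or data from the paper: it treats each data item a user inspects as a signal detection trial and computes d', the separation between the hit rate (true errors flagged) and the false-alarm rate (clean items wrongly flagged). The counts and the 10% base rate are hypothetical.

    from statistics import NormalDist

    def d_prime(hits, misses, false_alarms, correct_rejections):
        """Sensitivity index d' from signal detection theory.

        Hit rate H = errors correctly flagged / all true errors.
        False-alarm rate F = clean items flagged / all clean items.
        d' = z(H) - z(F), where z is the inverse standard normal CDF.
        """
        # Log-linear correction keeps z finite when a rate is 0 or 1.
        h = (hits + 0.5) / (hits + misses + 1)
        f = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
        z = NormalDist().inv_cdf
        return z(h) - z(f)

    # Hypothetical session: 100 data items, 10 of them erroneous (10% base rate).
    # The user flags 7 of the 10 errors and wrongly flags 5 of the 90 clean items.
    print(round(d_prime(hits=7, misses=3, false_alarms=5, correct_rejections=85), 2))

Under this framing, the paper's manipulations (base rates, goals, incentives) would surface as shifts in the hit and false-alarm rates, and hence in d' or in the user's response criterion.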